
    Communication Efficiency in Self-stabilizing Silent Protocols

    Self-stabilization is a general paradigm for providing forward recovery capabilities to distributed systems and networks. Intuitively, a protocol is self-stabilizing if it can recover, without external intervention, from any catastrophic transient failure. In this paper, our focus is to lower the communication complexity of self-stabilizing protocols below the need of checking every neighbor forever. In more detail, the contribution of the paper is threefold: (i) we provide new complexity measures for the communication efficiency of self-stabilizing protocols, especially in the stabilized phase or when there are no faults; (ii) on the negative side, we show that for non-trivial problems such as coloring, maximal matching, and maximal independent set, it is impossible to obtain (deterministic or probabilistic) self-stabilizing solutions in which every participant communicates with fewer than all of its neighbors in the stabilized phase; and (iii) on the positive side, we present protocols for coloring, maximal matching, and maximal independent set in which a fraction of the participants communicate with exactly one neighbor in the stabilized phase.
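    For intuition, the following is a minimal sketch, not the paper's protocol, of a classic self-stabilizing coloring rule simulated under a toy central daemon: an enabled node recolors itself with the smallest color unused by its neighbors. It also illustrates the cost the paper targets: even after stabilization, each node must keep reading all of its neighbors merely to confirm that it is not enabled.

```python
import random

def stabilize_coloring(adj):
    """adj: dict mapping each node to its set of neighbors (undirected graph)."""
    # Arbitrary, possibly conflicting, initial colors model a transient fault.
    color = {v: random.randrange(len(adj)) for v in adj}
    while True:
        # A node is "enabled" when it shares its color with some neighbor;
        # detecting this requires reading every neighbor's state.
        enabled = [v for v in adj if any(color[v] == color[u] for u in adj[v])]
        if not enabled:             # silent: no node will ever move again
            return color
        v = random.choice(enabled)  # the central daemon picks one enabled node
        taken = {color[u] for u in adj[v]}
        color[v] = min(c for c in range(len(adj)) if c not in taken)

triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
print(stabilize_coloring(triangle))  # a proper coloring, e.g. {0: 2, 1: 0, 2: 1}
```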

    Share Data Treatments and Analysis Processes in Technology Enhanced Learning

    Author manuscript, published in "Workshop Data Analysis and Interpretation for Learning Environments" (2013).

    In our multidisciplinary research team, which works with numerous and varied TEL systems, we have spent the last three years designing and implementing an open platform to collect, store, and share experimental data drawn from interactions with TEL systems, and to build, store, and share the analysis processes executed on these data. From our point of view, both the data and the analysis processes are worth storing and sharing, and moreover they must be joined in a single repository to get the whole picture. This communication presents the analysis-process part of the project. Sharing analysis processes, i.e., whole complex processes, is rather unusual, whereas contemporary platforms and software already offer generic algorithms to work on data (for instance, from a statistical or a data-mining point of view). Hence, we attempt to model the main concepts of global treatments for experimental data analysis in order to collect, execute, store, and then share them in a platform dedicated to TEL systems. The execution part is the most difficult and constraining part of our work, and it requires a complex architecture. An important part of the communication is therefore devoted to describing this architecture and the link between the global point of view of the whole process and the local point of view of the elementary or specific algorithms used within it. A short but realistic example of an application of our platform is given, with the definition of a global process and of an elementary algorithm used in that process. The process is executed on real data, leading to a graphical display of results, which are then briefly analyzed.
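    To make the global/elementary separation concrete, here is a minimal sketch of the idea as described; every name below is hypothetical, not the platform's actual API. A global process is a stored, shareable list of steps, each naming an elementary algorithm from a registry; execution chains the steps over the experimental data.

```python
from collections import Counter

REGISTRY = {}

def elementary(name):
    """Register a reusable elementary algorithm under a shareable name."""
    def wrap(fn):
        REGISTRY[name] = fn
        return fn
    return wrap

@elementary("filter_action")
def filter_action(traces, action):
    # Keep only interaction traces matching one action type.
    return [t for t in traces if t["action"] == action]

@elementary("count_per_user")
def count_per_user(traces):
    # Aggregate the remaining traces per user.
    return Counter(t["user"] for t in traces)

def execute(process, data):
    """Run a stored global process: each step names an elementary algorithm."""
    for step in process:
        data = REGISTRY[step["algo"]](data, **step.get("args", {}))
    return data

# A stored process definition: this is the shareable artifact.
process = [
    {"algo": "filter_action", "args": {"action": "answer"}},
    {"algo": "count_per_user"},
]
traces = [
    {"user": "a", "action": "answer"},
    {"user": "a", "action": "login"},
    {"user": "b", "action": "answer"},
]
print(execute(process, traces))  # Counter({'a': 1, 'b': 1})
```

    Storing the process as data rather than code is what makes it shareable and re-executable on other datasets, while the registry localizes the specific algorithms the global process merely references.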

    Grenoble - Rhône-Alpes, THEME: Computational models and simulation